Proximal gradient methods for learning : Wikipedia English edition
Proximal gradient methods for learning
Proximal gradient (forward backward splitting) methods for learning is an area of research in optimization and statistical learning theory which studies algorithms for a general class of convex regularization problems where the regularization penalty may not be differentiable. One such example is \ell_1 regularization (also known as Lasso) of the form
:\min_{w\in\mathbb{R}^d} \frac{1}{n}\sum_{i=1}^n (y_i- \langle w,x_i\rangle)^2+ \lambda \|w\|_1, \quad \text{where } x_i\in \mathbb{R}^d \text{ and } y_i\in\mathbb{R}.
Proximal gradient methods offer a general framework for solving regularization problems from statistical learning theory with penalties that are tailored to a specific problem application. Such customized penalties can help to induce certain structure in problem solutions, such as ''sparsity'' (in the case of lasso) or ''group structure'' (in the case of group lasso).
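As a concrete illustration of the objective above, the following is a minimal Python/NumPy sketch that evaluates the lasso objective on a toy dataset; the function and variable names are illustrative, not from any particular library.

```python
import numpy as np

def lasso_objective(w, X, y, lam):
    """Lasso objective: (1/n) * sum_i (y_i - <w, x_i>)^2 + lam * ||w||_1."""
    n = X.shape[0]
    residuals = y - X @ w
    return residuals @ residuals / n + lam * np.sum(np.abs(w))

# Tiny example: two samples in R^2.
X = np.array([[1.0, 0.0],
              [0.0, 1.0]])
y = np.array([2.0, 0.0])

print(lasso_objective(np.zeros(2), X, y, lam=0.1))       # objective at w = 0: 2.0
print(lasso_objective(np.array([2.0, 0.0]), X, y, 0.1))  # sparse exact fit: only the penalty, 0.2
```

Note that the second candidate fits the data exactly while remaining sparse, which is the kind of solution the \ell_1 penalty favors.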
== Relevant background ==

Proximal gradient methods are applicable in a wide variety of scenarios for solving convex optimization problems of the form
: \min_{x\in\mathcal{H}} F(x)+R(x),
where F is convex and differentiable with Lipschitz continuous gradient, R is convex, lower semicontinuous, and possibly nondifferentiable, and \mathcal{H} is some set, typically a Hilbert space. The usual criterion that x minimizes F(x)+R(x) if and only if \nabla (F+R)(x) = 0 in the convex, differentiable setting is now replaced by
: 0\in \partial (F+R)(x),
where \partial \varphi denotes the subdifferential of a real-valued, convex function \varphi.
Given a convex function \varphi:\mathcal{H} \to \mathbb{R}, an important operator to consider is its proximity operator \operatorname{prox}_{\varphi}:\mathcal{H}\to\mathcal{H} defined by
: \operatorname{prox}_{\varphi}(u) = \operatorname{arg}\min_{x\in\mathcal{H}} \varphi(x)+\frac{1}{2}\|u-x\|_2^2,
which is well-defined because of the strict convexity of the squared \ell_2 norm. The proximity operator can be seen as a generalization of a projection.
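For example, when \varphi = \lambda\|\cdot\|_1, the proximity operator has the closed form of componentwise soft thresholding, \operatorname{prox}_{\lambda\|\cdot\|_1}(u)_i = \operatorname{sign}(u_i)\max(|u_i|-\lambda, 0). A minimal Python/NumPy sketch (the function name is illustrative):

```python
import numpy as np

def prox_l1(u, lam):
    """Proximity operator of lam * ||.||_1: componentwise soft thresholding,
    prox(u)_i = sign(u_i) * max(|u_i| - lam, 0)."""
    return np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)

print(prox_l1(np.array([3.0, -0.5, 1.0]), 1.0))  # entries with |u_i| <= 1 are set to 0
```

Shrinking every coordinate toward zero and zeroing out the small ones is precisely how this operator induces sparsity.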
We see that the proximity operator is important because x^* is a minimizer to the problem \min_{x\in\mathcal{H}} F(x)+R(x) if and only if
: x^* = \operatorname{prox}_{\gamma R}\left(x^*-\gamma\nabla F(x^*)\right),
where \gamma>0 is any positive real number.
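This fixed-point characterization is exactly what the basic proximal gradient iteration exploits: repeat x_{k+1} = \operatorname{prox}_{\gamma R}(x_k - \gamma\nabla F(x_k)) until convergence. A hedged Python/NumPy sketch for the lasso problem, assuming a step size \gamma \le 1/L with L the Lipschitz constant of \nabla F (all names are illustrative):

```python
import numpy as np

def prox_l1(u, lam):
    """Soft thresholding: proximity operator of lam * ||.||_1."""
    return np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)

def prox_grad_lasso(X, y, lam, gamma, n_iter=200):
    """Proximal gradient iteration for (1/n)||y - Xw||^2 + lam * ||w||_1.
    Requires gamma <= 1/L, where L = (2/n) * largest eigenvalue of X^T X."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        grad = (2.0 / n) * X.T @ (X @ w - y)  # gradient of the smooth part F
        w = prox_l1(w - gamma * grad, gamma * lam)
    return w

# Orthogonal toy design: here the minimizer is soft(y_i, lam) coordinatewise.
X = np.eye(2)
y = np.array([3.0, 0.5])
w = prox_grad_lasso(X, y, lam=1.0, gamma=1.0)
print(w)  # [2. 0.]

# w is (approximately) a fixed point of the prox-gradient map, as the theory states:
grad = (2.0 / 2) * X.T @ (X @ w - y)
assert np.allclose(w, prox_l1(w - 1.0 * grad, 1.0))
```

The final assertion checks the displayed fixed-point condition x^* = \operatorname{prox}_{\gamma R}(x^* - \gamma\nabla F(x^*)) numerically.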

Excerpt source: Wikipedia, the free encyclopedia.
Read the full text of "Proximal gradient methods for learning" on Wikipedia.




Copyright(C) kotoba.ne.jp 1997-2016. All Rights Reserved.